Abstract: Existing knowledge graph completion models that exploit textual or neighbor information ignore the interaction between texts and neighbors, making it difficult to capture the information most semantically relevant to an entity. In addition, models based on convolutional neural networks do not take the relation-specific information within entities into account, which degrades prediction performance. In this paper, a circular convolutional neural network model based on triplet attention is proposed that combines textual and neighbor information. Firstly, words strongly semantically relevant to an entity are selected from its textual description by semantic matching and combined with its topological neighbors as entity neighbors to enhance the entity representation. Next, the fused entity representation and the relation representation are reshaped. Finally, triplet attention is employed to optimize the convolution input so that the convolution operation extracts relation-related features from the entities, improving model performance. Experiments on several public datasets show that the proposed model achieves superior performance.
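A minimal sketch of the scoring pipeline described above, written in PyTorch. It is not the authors' released implementation: the single-channel reshaping, the simplified triplet-attention branches, all layer sizes, and the class name TripletAttnCircularConv are illustrative assumptions. It only shows how a fused entity vector and a relation vector can be reshaped into a 2D map, reweighted by triplet attention, and passed through a circular convolution to score candidate tail entities.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ZPool(nn.Module):
    """Concatenate max- and mean-pooling along the channel dimension."""
    def forward(self, x):
        return torch.cat([x.max(dim=1, keepdim=True)[0],
                          x.mean(dim=1, keepdim=True)], dim=1)


class AttentionGate(nn.Module):
    """One branch of triplet attention: Z-pool -> conv -> sigmoid gate."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.pool = ZPool()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        return x * torch.sigmoid(self.conv(self.pool(x)))


class TripletAttention(nn.Module):
    """Simplified triplet attention: three branches over (C,H), (C,W) and (H,W)."""
    def __init__(self):
        super().__init__()
        self.cw = AttentionGate()   # channel-width interaction
        self.ch = AttentionGate()   # channel-height interaction
        self.hw = AttentionGate()   # plain spatial attention

    def forward(self, x):
        # Rotate the tensor so each branch attends over a different pair of
        # dimensions, then average the three gated outputs.
        x_cw = self.cw(x.permute(0, 2, 1, 3)).permute(0, 2, 1, 3)
        x_ch = self.ch(x.permute(0, 3, 2, 1)).permute(0, 3, 2, 1)
        x_hw = self.hw(x)
        return (x_cw + x_ch + x_hw) / 3.0


class TripletAttnCircularConv(nn.Module):
    """Score (head, relation, ?) queries with triplet attention + circular convolution."""
    def __init__(self, n_ent, n_rel, dim=200, h=10, w=20, n_filters=32):
        super().__init__()
        assert h * w == dim
        self.ent = nn.Embedding(n_ent, dim)   # stands in for the text/neighbor-enhanced entity vectors
        self.rel = nn.Embedding(n_rel, dim)
        self.h, self.w = h, w
        self.attn = TripletAttention()
        # padding_mode="circular" yields the circular (wrap-around) convolution.
        self.conv = nn.Conv2d(1, n_filters, 3, padding=1, padding_mode="circular")
        self.fc = nn.Linear(n_filters * 2 * h * w, dim)

    def forward(self, head_idx, rel_idx):
        # Reshape entity and relation vectors into a single-channel 2D map.
        e = self.ent(head_idx).view(-1, 1, self.h, self.w)
        r = self.rel(rel_idx).view(-1, 1, self.h, self.w)
        x = torch.cat([e, r], dim=2)          # (batch, 1, 2h, w)
        x = self.attn(x)                      # triplet attention on the convolution input
        x = F.relu(self.conv(x))              # circular convolution
        x = self.fc(x.flatten(1))             # project back to the embedding space
        return x @ self.ent.weight.t()        # scores against all candidate tail entities


# Usage: score all tail entities for a small random batch of (head, relation) queries.
model = TripletAttnCircularConv(n_ent=100, n_rel=20)
scores = model(torch.tensor([0, 1]), torch.tensor([3, 5]))
print(scores.shape)  # torch.Size([2, 100])
```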